Generalized neuron: Feedforward and recurrent architectures

Authors

  • Raghavendra V. Kulkarni
  • Ganesh K. Venayagamoorthy
Abstract

Feedforward neural networks such as multilayer perceptrons (MLPs) and recurrent neural networks are widely used for pattern classification, nonlinear function approximation, density estimation and time-series prediction. Performing these tasks accurately usually requires a large number of neurons, which makes MLPs less attractive for implementation on resource-constrained hardware platforms. This paper highlights the benefits of feedforward and recurrent forms of a compact neural architecture called the generalized neuron (GN). It demonstrates that the GN and the recurrent GN (RGN) perform well at classification, nonlinear function approximation, density estimation and chaotic time-series prediction. Owing to its two aggregation functions and two activation functions, the GN is resilient to the nonlinearities of complex problems. Particle swarm optimization (PSO) is proposed as the training algorithm for the GN and RGN. Because they have few trainable parameters, the GN and RGN require little memory and computation, making them attractive choices for fast implementation on resource-constrained hardware platforms.
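The abstract's combination of two aggregation functions and two activation functions, trained with PSO, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes the common GN variant with a sigmoid over a weighted sum and a Gaussian over a weighted product, blended by a weight `W`, and a basic global-best PSO with assumed coefficients (inertia 0.7, c1 = c2 = 1.5). The parameter layout and function names (`gn_forward`, `pso_train`) are hypothetical.

```python
import numpy as np

def gn_forward(x, params):
    """Forward pass of a generalized neuron for an n-dimensional input x.

    Assumed parameter layout: n summation weights, a summation bias,
    n product weights, a product bias, and the blending weight W.
    """
    n = len(x)
    w_s, b_s = params[:n], params[n]
    w_p, b_p = params[n + 1:2 * n + 1], params[2 * n + 1]
    W = params[2 * n + 2]
    o_sigma = 1.0 / (1.0 + np.exp(-(np.dot(w_s, x) + b_s)))  # sigmoid on weighted sum
    o_pi = np.exp(-(np.prod(w_p * x) * b_p) ** 2)            # Gaussian on weighted product
    return W * o_sigma + (1.0 - W) * o_pi                    # blended output

def pso_train(xs, ys, dim, n_particles=20, iters=150, seed=0):
    """Fit GN parameters with a basic global-best PSO (illustrative coefficients)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1.0, 1.0, (n_particles, dim))  # candidate parameter vectors
    vel = np.zeros_like(pos)

    def mse(p):
        preds = np.array([gn_forward(x, p) for x in xs])
        return float(np.mean((preds - ys) ** 2))

    pbest = pos.copy()
    pbest_val = np.array([mse(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1 = rng.random(pos.shape)
        r2 = rng.random(pos.shape)
        # velocity update: inertia + cognitive pull + social pull
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([mse(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())
```

Note the small parameter count the abstract emphasizes: for an n-dimensional input the whole neuron has only 2n + 3 trainable parameters (5 for a scalar input), versus the weight matrices of a multi-neuron MLP.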


Related articles

Feedforward Approximations to Dynamic Recurrent Network Architectures

Recurrent neural network architectures can have useful computational properties, with complex temporal dynamics and input-sensitive attractor states. However, evaluation of recurrent dynamic architectures requires solving systems of differential equations, and the number of evaluations required to determine their response to a given input can vary with the input or can be indeterminate altogeth...


Stability criteria for three-layer locally recurrent networks

This paper deals with a discrete-time recurrent neural network built from dynamic neuron models. Dynamics are reproduced within each individual neuron, so the network considered is locally recurrent and globally feedforward. Conditions for global stability of the network are derived using Lyapunov's second method.


Forecasting Sunspot Numbers with Neural Networks

This paper presents a feedforward neural network approach to sunspot forecasting. The sunspot series were analyzed with feedforward neural networks, formalized based on statistical models. The statistical models were used as comparison models along with recurrent neural networks. The feedforward networks had 24 inputs (depending on the number of predictor variables), one hidden layer with 20 ...


Recurrent Network Models of Sequence Generation and Memory

Sequential activation of neurons is a common feature of network activity during a variety of behaviors, including working memory and decision making. Previous network models for sequences and memory emphasized specialized architectures in which a principled mechanism is pre-wired into their connectivity. Here we demonstrate that, starting from random connectivity and modifying a small fraction ...


Architectural Complexity Measures of Recurrent Neural Networks

In this paper, we systematically analyse the connecting architectures of recurrent neural networks (RNNs). Our main contribution is twofold: first, we present a rigorous graph-theoretic framework describing the connecting architectures of RNNs in general; second, we propose three architecture complexity measures of RNNs: (a) the recurrent depth, which captures the RNN's over-time nonlinear compl...




Journal:
  • Neural networks : the official journal of the International Neural Network Society

Volume 22, Issue 7

Pages: -

Published: 2009